Translation: from English to Russian / from Russian to English

average information content

See also in other dictionaries:

  • information entropy — noun. A measure of the uncertainty associated with a random variable; a measure of the average information content one is missing when one does not know the value of the random variable; usually in units such as bits. A passphrase is similar to… (Wiktionary)

  • Information theory — Information theory is a branch of applied mathematics and electrical engineering involving the quantification of information. It was developed by Claude E. Shannon to find fundamental… (Wikipedia)

  • Entropy (information theory) — In information theory, entropy is a measure of the uncertainty associated with a random variable. The term by itself in this context usually refers to the Shannon entropy, which quantifies, in the sense of an expected value, the information… (Wikipedia)
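
For reference, the standard definition these entries refer to, for a discrete random variable X with probability mass function p(x), is

    H(X) = -\sum_{x} p(x)\,\log_2 p(x)

measured in bits when the logarithm is taken to base 2. This is precisely the "average information content" of the headword: the expected amount of information gained on learning the value of X.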

  • Information addiction — a condition whereby connected users experience a hit of pleasure, stimulation and escape, and technology affects attention span, creativity and focus (Richtel, M., "The Lure of Data: Is It Addictive?", The New York Times, 6 July 2003), which has… (Wikipedia)

  • Information and communication technologies for development — [image caption: An OLPC class in Ulaanbaatar, Mongolia] … (Wikipedia)

  • information theory — the mathematical theory concerned with the content, transmission, storage, and retrieval of information, usually in the form of messages or data, and esp. by means of computers. [1945–50] (Universalium)

  • Entropy in thermodynamics and information theory — There are close parallels between the mathematical expressions for the thermodynamic entropy, usually denoted by S, of a physical system in the statistical thermodynamics established by Ludwig Boltzmann and J. Willard Gibbs in the 1870s, and the… (Wikipedia)
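
The parallel in question is between the Gibbs entropy of statistical mechanics and the Shannon entropy, which share the same functional form up to the choice of constant and logarithm base:

    S = -k_{B} \sum_{i} p_{i} \ln p_{i} \qquad \text{versus} \qquad H = -\sum_{i} p_{i} \log_2 p_{i}

where k_{B} is the Boltzmann constant and p_{i} is the probability of microstate (respectively, message) i.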

  • Quantities of information — [image caption: A simple information diagram illustrating the relationships among some of Shannon's basic quantities of information.] The mathematical theory of information is based on probability theory and statistics, and measures information with several… (Wikipedia)
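
Among the basic quantities such a diagram relates are the entropy H(X), the joint entropy H(X,Y), the conditional entropy H(X \mid Y), and the mutual information I(X;Y), connected by the standard identities

    H(X,Y) = H(X) + H(Y \mid X), \qquad I(X;Y) = H(X) + H(Y) - H(X,Y) = H(X) - H(X \mid Y).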

  • Semiotic information theory — considers the information content of signs and expressions as it is conceived within the semiotic or sign-relational framework developed by Charles Sanders Peirce. Information and uncertainty: the good of information is its use in reducing our… (Wikipedia)

  • Geographic information system — A geographic information system, geographical information science, or geospatial information studies is a system designed to capture, store, manipulate, analyze, manage, and present… (Wikipedia)

  • Self-information — In information theory (elaborated by Claude E. Shannon, 1948), self-information is a measure of the information content associated with the outcome of a random variable. It is expressed in a unit of information, for example bits, nats, or… (Wikipedia)
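
In the usual notation, the self-information of an outcome x that occurs with probability p(x) is

    I(x) = -\log_2 p(x)

in bits, and the Shannon entropy is its expected value, H(X) = \mathbb{E}[I(X)], which is why entropy is described as the average information content of a random variable.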
